51 research outputs found

    Auditory Spatial Layout

    All auditory sensory information is packaged in a pair of acoustical pressure waveforms, one at each ear. While there is obvious structure in these waveforms, that structure (temporal and spectral patterns) bears no simple relationship to the structure of the environmental objects that produced them. The properties of auditory objects and their layout in space must be derived completely from higher-level processing of the peripheral input. This chapter begins with a discussion of the peculiarities of acoustical stimuli and how they are received by the human auditory system. A distinction is made between the ambient sound field and the effective stimulus, to differentiate the perceptual distinctions among various simple classes of sound sources (ambient field) from the known perceptual consequences of the linear transformations of the sound wave from source to receiver (effective stimulus). Next, the definition of an auditory object is dealt with, specifically the question of how the various components of a sound stream become segregated into distinct auditory objects. The remainder of the chapter focuses on issues related to the spatial layout of auditory objects, both stationary and moving.

    Value Encoding in Single Neurons in the Human Amygdala during Decision Making

    A growing consensus suggests that the brain makes simple choices by assigning values to the stimuli under consideration and then comparing these values to make a decision. However, the network involved in computing the values has not yet been fully characterized. Here, we investigated whether the human amygdala plays a role in the computation of stimulus values at the time of decision making. We recorded single-neuron activity from the amygdala of awake patients while they made simple purchase decisions over food items. We found 16 amygdala neurons, located primarily in the basolateral nucleus, that responded linearly to the values assigned to individual items.

    Auditory space expansion via linear filtering


    Common Fronto-temporal Effective Connectivity in Humans and Monkeys

    Human brain pathways supporting language and declarative memory are thought to have differentiated substantially during evolution. However, cross-species comparisons are missing on site-specific effective connectivity between regions important for cognition. We harnessed functional imaging to visualize the effects of direct electrical brain stimulation in macaque monkeys and human neurosurgery patients. We discovered comparable effective connectivity between caudal auditory cortex and both ventrolateral prefrontal cortex (VLPFC, including area 44) and parahippocampal cortex in both species. Human-specific differences were clearest in the form of stronger hemispheric lateralization effects. In humans, electrical tractography revealed remarkably rapid evoked potentials in VLPFC following auditory cortex stimulation, and speech sounds drove VLPFC, consistent with prior evidence in monkeys of direct auditory cortex projections to homologous vocalization-responsive regions. The results identify a common effective connectivity signature in human and nonhuman primates, which from auditory cortex appears equally direct to VLPFC and indirect to the hippocampus.

    Real-Time Contrast Enhancement to Improve Speech Recognition

    An algorithm that operates in real time to enhance the salient features of speech is described and its efficacy is evaluated. The Contrast Enhancement (CE) algorithm implements dynamic compressive gain and lateral inhibitory sidebands across channels in a modified winner-take-all circuit, which together produce a form of suppression that sharpens the dynamic spectrum. Normal-hearing listeners identified spectrally smeared consonants (VCVs) and vowels (hVds) in quiet and in noise. Consonant and vowel identification, especially in noise, was improved by the processing. The amount of improvement did not depend on the degree of spectral smearing or talker characteristics. For consonants, when results were analyzed according to phonetic feature, the most consistent improvement was for place of articulation. This is encouraging for hearing-aid applications because confusions between consonants differing in place are a persistent problem for listeners with sensorineural hearing loss.
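    The lateral-inhibition idea behind spectral contrast enhancement can be illustrated with a minimal sketch. This is not the published CE algorithm (which adds dynamic compressive gain in a winner-take-all circuit); the kernel weights and channel values below are hypothetical toy parameters chosen only to show how subtracting flanking-channel energy sharpens a smeared spectral peak:

    ```python
    import numpy as np

    def lateral_inhibition(spectrum, inhibit=0.3):
        """Sharpen a smeared across-channel profile by subtracting a
        fraction of each channel's immediate neighbors (toy parameters)."""
        padded = np.pad(spectrum, 1, mode="edge")
        sidebands = padded[:-2] + padded[2:]       # left + right neighbors
        enhanced = spectrum - inhibit * sidebands  # center minus sidebands
        return np.clip(enhanced, 0.0, None)        # half-wave rectify

    smeared = np.array([1.0, 2.0, 4.0, 2.0, 1.0])  # broad spectral peak
    sharp = lateral_inhibition(smeared)
    # The peak-to-skirt ratio grows, since flanking channels are
    # suppressed proportionally more than the peak channel.
    ```

    In this sketch the ratio of the peak channel to its neighbors increases (here from 2:1 to roughly 4:1), which is the sense in which inhibitory sidebands "sharpen the dynamic spectrum".
    
    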

    Directional influence between the human amygdala and orbitofrontal cortex at the time of decision-making.

    There is a growing consensus that the brain makes simple choices, such as choosing between an apple and an orange, by assigning value to the options under consideration and comparing those values to make a choice. There is also a consensus that value signals computed in orbitofrontal cortex (OFC) and amygdala play a critical role in the choice process. However, the nature of the flow of information between OFC and amygdala at the time of decision is still unknown. In order to study this question, simultaneous local field potentials were recorded from OFC and amygdala in human patients while they performed a simple food choice task. Although the interaction of these circuits has been studied in animals, this study examines the effective connectivity directly in the human brain on a moment-by-moment basis. A spectral conditional Granger causality analysis was performed to test whether the modulation of activity goes mainly from OFC to amygdala, from amygdala to OFC, or is bi-directional. Influence from amygdala to OFC was dominant prior to the revealed choice, with a small but significant OFC influence on the amygdala earlier in the trial. Alpha oscillation amplitudes analyzed with the Hilbert-Huang transform revealed differences in choice valence coincident with temporally specific amygdala influence on the OFC.
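    The directionality question above can be sketched with ordinary time-domain Granger causality, a simpler relative of the spectral conditional variant the study used: signal x "Granger-causes" y if adding x's past to y's own past reduces the prediction error for y. The AR coefficients and simulated signals below are illustrative assumptions, not the recorded data:

    ```python
    import numpy as np

    def granger_stat(x, y, lag=2):
        """Log ratio of residual variances for predicting y from its own
        past versus from both pasts; larger means stronger x -> y influence
        (toy statistic, no significance test)."""
        n = len(y)
        target = y[lag:]
        own = np.column_stack([y[lag - 1 - k : n - 1 - k] for k in range(lag)])
        both = np.column_stack(
            [own] + [x[lag - 1 - k : n - 1 - k] for k in range(lag)]
        )
        resid_own = target - own @ np.linalg.lstsq(own, target, rcond=None)[0]
        resid_both = target - both @ np.linalg.lstsq(both, target, rcond=None)[0]
        return np.log(np.var(resid_own) / np.var(resid_both))

    # Simulate a one-way coupling: x drives y with a one-sample delay.
    rng = np.random.default_rng(0)
    x = rng.standard_normal(500)
    y = np.zeros(500)
    for t in range(1, 500):
        y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

    # granger_stat(x, y) comes out much larger than granger_stat(y, x),
    # recovering the simulated x -> y direction of influence.
    ```

    The spectral conditional version decomposes this same asymmetry by frequency and partials out third signals, which is what lets the study localize amygdala-to-OFC influence to particular moments and bands.
    
    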